Am1n3e / active-learning-transformer
A hands-on tutorial on how to use Active Learning with Transformer models.
☆15 · Updated 4 years ago
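The core idea the tutorial covers, active learning, is a loop in which a model ranks unlabeled examples by how uncertain its predictions are and requests labels only for the most uncertain ones. A minimal sketch of the uncertainty-sampling step is below; `select_most_uncertain` and the toy probability pool are hypothetical names for illustration, and in practice the probabilities would come from a Transformer classifier's softmax outputs rather than hard-coded lists.

```python
import math

def entropy(probs):
    """Shannon entropy of one predicted class distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_most_uncertain(predictions, k):
    """Return indices of the k examples with the highest predictive
    entropy (hypothetical helper, not from the repo itself)."""
    ranked = sorted(range(len(predictions)),
                    key=lambda i: entropy(predictions[i]),
                    reverse=True)
    return ranked[:k]

# Toy unlabeled pool: each row is a softmax output over 3 classes.
pool = [
    [0.98, 0.01, 0.01],  # confident prediction -> low entropy
    [0.34, 0.33, 0.33],  # near-uniform -> highest entropy
    [0.70, 0.20, 0.10],  # moderately uncertain
]
print(select_most_uncertain(pool, 2))  # → [1, 2]
```

The selected examples would then be sent to an annotator, added to the labeled set, and the model retrained, repeating until the labeling budget is spent.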
Alternatives and similar repositories for active-learning-transformer
Users interested in active-learning-transformer are comparing it to the libraries listed below.
- PERFECT: Prompt-free and Efficient Few-shot Learning with Language Models ☆111 · Updated last month
- Reference implementation for Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model ☆45 · Updated 4 months ago
- Few-shot Learning with Auxiliary Data ☆31 · Updated 2 years ago
- Starbucks: Improved Training for 2D Matryoshka Embeddings ☆22 · Updated 7 months ago
- Embedding Recycling for Language Models ☆38 · Updated 2 years ago
- [ICML 2023] Exploring the Benefits of Training Expert Language Models over Instruction Tuning ☆98 · Updated 2 years ago
- ☆44 · Updated last year
- [ICLR 2023] PyTorch code of Summarization Programs: Interpretable Abstractive Summarization with Neural Modular Trees ☆24 · Updated 2 years ago
- Code repo for "Model-Generated Pretraining Signals Improves Zero-Shot Generalization of Text-to-Text Transformers" (ACL 2023) ☆22 · Updated 2 years ago
- An extension of the Transformers library that adds a T5ForSequenceClassification class. ☆40 · Updated 2 years ago
- Official code and model checkpoints for our EMNLP 2022 paper "RankGen - Improving Text Generation with Large Ranking Models" (https://arx… ☆138 · Updated 2 years ago
- This repository accompanies our paper "Do Prompt-Based Models Really Understand the Meaning of Their Prompts?" ☆85 · Updated 3 years ago
- A package for fine-tuning pretrained NLP transformers using semi-supervised learning ☆14 · Updated 4 years ago
- Code for our paper: "GrIPS: Gradient-free, Edit-based Instruction Search for Prompting Large Language Models" ☆57 · Updated 2 years ago
- ☆54 · Updated 3 years ago
- The official implementation of "Distilling Relation Embeddings from Pre-trained Language Models, EMNLP 2021 main conference", a high-qual…