allenai / tango
Organize your experiments into discrete steps that can be cached and reused throughout the lifetime of your research project.
☆538 · Updated 7 months ago
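For a sense of what tango looks like in use, here is a minimal sketch of its step-registration pattern, based on the library's documented Python API; the step name `add_numbers` and its inputs are illustrative, not from the repository.

```python
from tango import Step

# Register the step under a name so it can be referenced from a run config.
@Step.register("add_numbers")
class AddNumbers(Step):
    # Deterministic, cacheable steps are computed once and reused across runs.
    DETERMINISTIC = True
    CACHEABLE = True

    def run(self, a: int, b: int) -> int:
        return a + b
```

A run config then wires registered steps together by name, and tango caches each step's output in a workspace so unchanged steps are skipped on later runs.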
Alternatives and similar repositories for tango:
Users that are interested in tango are comparing it to the libraries listed below
- Task-based datasets, preprocessing, and evaluation for sequence models. ☆566 · Updated this week
- 🤖 A PyTorch library of curated Transformer models and their composable components ☆873 · Updated 9 months ago
- Mistral: A strong, northwesterly wind: Framework for transparent and accessible large-scale language model training, built with Hugging F… ☆568 · Updated last year
- Flexible components pairing 🤗 Transformers with Pytorch Lightning ☆613 · Updated 2 years ago
- An open collection of implementation tips, tricks and resources for training large language models ☆466 · Updated last year
- Fast & Simple repository for pre-training and fine-tuning T5-style models ☆986 · Updated 4 months ago
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) ☆461 · Updated 2 years ago
- Code for T-Few from "Few-Shot Parameter-Efficient Fine-Tuning is Better and Cheaper than In-Context Learning" ☆438 · Updated last year
- Interpretable Evaluation for AI Systems ☆361 · Updated last year
- NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1GPU + 1Day ☆253 · Updated last year
- Code repository for supporting the paper "Atlas: Few-shot Learning with Retrieval Augmented Language Models" (https://arxiv.org/abs/2208.03… ☆524 · Updated last year
- Repository containing code for "How to Train BERT with an Academic Budget" paper ☆310 · Updated last year
- Code repository for the paper "Matryoshka Representation Learning" ☆444 · Updated 10 months ago
- Interpretability for sequence generation models 🐛 🔍 ☆393 · Updated 2 months ago
- Active Learning for Text Classification in Python ☆604 · Updated last week
- Cramming the training of a (BERT-type) language model into limited compute. ☆1,307 · Updated 7 months ago
- Powerful unsupervised domain adaptation method for dense retrieval. Requires only unlabeled corpus and yields massive improvement: "GPL: … ☆327 · Updated last year
- Implementation of RETRO, Deepmind's Retrieval based Attention net, in Pytorch ☆857 · Updated last year
- Contriever: Unsupervised Dense Information Retrieval with Contrastive Learning ☆705 · Updated last year
- Code for the ALiBi method for transformer language models (ICLR 2022); see the sketch after this list ☆510 · Updated last year
- Build, evaluate, understand, and fix LLM-based apps ☆484 · Updated last year
- Run Effective Large Batch Contrastive Learning Beyond GPU/TPU Memory Constraint ☆371 · Updated 9 months ago
- String-to-String Algorithms for Natural Language Processing ☆539 · Updated 5 months ago
- All-in-one text de-duplication ☆648 · Updated 7 months ago
- AI Data Management & Evaluation Platform ☆215 · Updated last year
- NL-Augmenter 🦎 → 🐍 A Collaborative Repository of Natural Language Transformations ☆779 · Updated 7 months ago
- Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code. ☆1,316 · Updated last year
- Original Implementation of Prompt Tuning from Lester et al., 2021 ☆663 · Updated last month
- A prize for finding tasks that cause large language models to show inverse scaling ☆605 · Updated last year
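The ALiBi entry above refers to Attention with Linear Biases: instead of positional embeddings, each attention head adds a penalty proportional to query-key distance to its scores. Below is a minimal PyTorch sketch of that bias, written from the paper's description rather than the repository's code; the helper name `alibi_bias` is ours, and it assumes the head count is a power of two.

```python
import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    """Per-head linear biases added to causal attention scores.

    Head k (1-indexed) gets slope 2**(-8k/num_heads); the bias for query
    position i attending to key position j <= i is -slope * (i - j).
    Assumes num_heads is a power of two, per the paper's slope recipe.
    """
    slopes = torch.tensor(
        [2.0 ** (-8.0 * (k + 1) / num_heads) for k in range(num_heads)]
    )
    pos = torch.arange(seq_len)
    # (i - j) for j <= i; future positions (j > i) are clamped to zero bias.
    distance = (pos[:, None] - pos[None, :]).clamp(min=0)
    return -slopes[:, None, None] * distance  # shape: (heads, seq, seq)

# usage: scores = scores + alibi_bias(num_heads, seq_len)
```

The bias is zero on future positions here, but it does not mask them; a causal mask still has to be applied separately before the softmax.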