LauraRuis / do-pigs-fly
☆22 · Updated 2 years ago
Alternatives and similar repositories for do-pigs-fly
Users interested in do-pigs-fly are comparing it to the libraries listed below.
- ☆101 · Updated 2 years ago
- ☆72 · Updated 2 years ago
- Minimum Bayes Risk Decoding for Hugging Face Transformers ☆60 · Updated last year
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset. ☆96 · Updated 2 years ago
- ☆65 · Updated 2 years ago
- Official code and model checkpoints for our EMNLP 2022 paper "RankGen - Improving Text Generation with Large Ranking Models" (https://arx… ☆138 · Updated 2 years ago
- M2D2: A Massively Multi-domain Language Modeling Dataset (EMNLP 2022) by Machel Reid, Victor Zhong, Suchin Gururangan, Luke Zettlemoyer ☆54 · Updated 3 years ago
- Automatic metrics for GEM tasks ☆67 · Updated 3 years ago
- Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models. ☆85 · Updated last year
- Implementation of the paper "Sentence Bottleneck Autoencoders from Transformer Language Models" ☆17 · Updated 3 years ago
- ☆67 · Updated 3 years ago
- The original implementation of Min et al. "Nonparametric Masked Language Modeling" (paper: https://arxiv.org/abs/2212.01349) ☆158 · Updated 2 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆57 · Updated 3 years ago
- Apps built using Inspired Cognition's Critique. ☆57 · Updated 2 years ago
- Official repository for our EACL 2023 paper "LongEval: Guidelines for Human Evaluation of Faithfulness in Long-form Summarization" (https… ☆44 · Updated last year
- Simple-to-use scoring function for arbitrarily tokenized texts. ☆47 · Updated 9 months ago
- Pipeline for pulling and processing online language model pretraining data from the web ☆178 · Updated 2 years ago
- Research code for the paper "How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models" ☆27 · Updated 4 years ago
- ☆97 · Updated 3 years ago
- PyTorch implementation of EncT5: Fine-tuning T5 Encoder for Non-autoregressive Tasks ☆63 · Updated 3 years ago
- BLOOM+1: Adapting the BLOOM model to support a new unseen language ☆74 · Updated last year
- Code and Data for Evaluation WG ☆42 · Updated 3 years ago
- A library for finding knowledge neurons in pretrained transformer models. ☆158 · Updated 3 years ago
- The official code of LM-Debugger, an interactive tool for inspection and intervention in transformer-based language models. ☆180 · Updated 3 years ago
- A library for parameter-efficient and composable transfer learning for NLP with sparse fine-tunings. ☆75 · Updated last year
- Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022) ☆80 · Updated 2 years ago
- Interpreting Language Models with Contrastive Explanations (EMNLP 2022 Best Paper Honorable Mention) ☆62 · Updated 3 years ago
- Benchmark API for Multidomain Language Modeling ☆25 · Updated 3 years ago
- This repository accompanies our paper “Do Prompt-Based Models Really Understand the Meaning of Their Prompts?” ☆85 · Updated 3 years ago
- Query-focused summarization data ☆42 · Updated 2 years ago