shayne-longpre / a-pretrainers-guide
☆72 · Updated 2 years ago
Alternatives and similar repositories for a-pretrainers-guide
Users that are interested in a-pretrainers-guide are comparing it to the libraries listed below
- The original implementation of Min et al., "Nonparametric Masked Language Modeling" (paper: https://arxiv.org/abs/2212.01349) ☆158 · Updated 2 years ago
- Official code and model checkpoints for our EMNLP 2022 paper "RankGen - Improving Text Generation with Large Ranking Models" (https://arx…) ☆138 · Updated 2 years ago
- ☆101 · Updated 2 years ago
- Pipeline for pulling and processing online language model pretraining data from the web ☆177 · Updated 2 years ago
- [ICLR 2023] Guess the Instruction! Flipped Learning Makes Language Models Stronger Zero-Shot Learners ☆116 · Updated 3 months ago
- ☆39 · Updated last year
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data; it should work with any Hugging Face text dataset. ☆95 · Updated 2 years ago
- ☆65 · Updated 2 years ago
- A Multilingual Replicable Instruction-Following Model ☆95 · Updated 2 years ago
- [AAAI 2024] Investigating the Effectiveness of Task-Agnostic Prefix Prompt for Instruction Following ☆78 · Updated last year
- Apps built using Inspired Cognition's Critique. ☆58 · Updated 2 years ago
- ☆67 · Updated 3 years ago
- BLOOM+1: Adapting the BLOOM model to support a new, unseen language ☆73 · Updated last year
- M2D2: A Massively Multi-domain Language Modeling Dataset (EMNLP 2022) by Machel Reid, Victor Zhong, Suchin Gururangan, Luke Zettlemoyer ☆54 · Updated 2 years ago
- Official code for the EMNLP 2022 paper "SCROLLS: Standardized CompaRison Over Long Language Sequences". ☆70 · Updated last year
- ☆97 · Updated 3 years ago
- A framework for few-shot evaluation of autoregressive language models. ☆105 · Updated 2 years ago
- Tk-Instruct is a Transformer model that is tuned to solve many NLP tasks by following instructions. ☆182 · Updated 2 years ago
- ☆180 · Updated 2 years ago
- Evaluation pipeline for the BabyLM Challenge 2023. ☆77 · Updated last year
- ☆159 · Updated 2 years ago
- Open Instruction Generalist is an assistant trained on a massive set of synthetic instructions to perform many millions of tasks. ☆210 · Updated last year
- Official repository for our EACL 2023 paper "LongEval: Guidelines for Human Evaluation of Faithfulness in Long-form Summarization" (https…) ☆44 · Updated last year
- ☆139 · Updated 8 months ago
- Repo for the ICML 2023 paper "Why do Nearest Neighbor Language Models Work?" ☆59 · Updated 2 years ago
- ☆98 · Updated 2 years ago
- DEMix Layers for Modular Language Modeling ☆54 · Updated 4 years ago
- [ICML 2023] Exploring the Benefits of Training Expert Language Models over Instruction Tuning ☆99 · Updated 2 years ago
- Experiments with generating open-source language model assistants ☆97 · Updated 2 years ago
- SILO Language Models code repository ☆82 · Updated last year