EleutherAI / the-pile
☆1,528 · Updated last year
Alternatives and similar repositories for the-pile:
Users interested in the-pile are comparing it to the libraries listed below.
- Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment and data. ☆982 · Updated 6 months ago
- Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch ☆857 · Updated last year
- Open clone of OpenAI's unreleased WebText dataset scraper. This version uses pushshift.io files instead of the API for speed. ☆723 · Updated 2 years ago
- Ongoing research training transformer language models at scale, including: BERT & GPT-2 ☆1,357 · Updated 10 months ago
- ☆2,725 · Updated last week
- Fast Inference Solutions for BLOOM ☆563 · Updated 3 months ago
- ☆1,490 · Updated 3 months ago
- Tools to download and clean up Common Crawl data ☆981 · Updated last year
- Toolkit for creating, sharing and using natural language prompts ☆2,747 · Updated last year
- Crosslingual Generalization through Multitask Finetuning ☆524 · Updated 4 months ago
- Reproduce results and replicate training of T0 (Multitask Prompted Training Enables Zero-Shot Task Generalization) ☆461 · Updated 2 years ago
- The hub for EleutherAI's work on interpretability and learning dynamics ☆2,349 · Updated last month
- Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models ☆2,937 · Updated 6 months ago
- This repository contains the code for "Exploiting Cloze Questions for Few-Shot Text Classification and Natural Language Inference" ☆1,626 · Updated last year
- Guide: Finetune GPT2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed ☆437 · Updated last year
- BLEURT is a metric for Natural Language Generation based on transfer learning. ☆712 · Updated last year
- A modular RL library to fine-tune language models to human preferences ☆2,260 · Updated 10 months ago
- Expanding natural instructions ☆970 · Updated last year
- Original implementation of Prompt Tuning from Lester et al., 2021 ☆663 · Updated last month
- MII makes low-latency and high-throughput inference possible, powered by DeepSpeed. ☆1,948 · Updated this week
- ☆1,162 · Updated 5 months ago
- Measuring Massive Multitask Language Understanding | ICLR 2021 ☆1,291 · Updated last year
- Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models ☆1,999 · Updated 5 months ago
- Human preference data for "Training a Helpful and Harmless Assistant with Reinforcement Learning from Human Feedback" ☆1,673 · Updated last year
- The implementation of DeBERTa ☆2,027 · Updated last year
- Code for "Learning to summarize from human feedback" ☆1,003 · Updated last year
- Kernl lets you run PyTorch transformer models several times faster on GPU with a single line of code, and is designed to be easily hackable ☆1,550 · Updated 11 months ago
- Cramming the training of a (BERT-type) language model into limited compute. ☆1,310 · Updated 7 months ago
- Crawl BookCorpus ☆814 · Updated last year
- Parallelformers: An Efficient Model Parallelization Toolkit for Deployment ☆781 · Updated last year
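For readers evaluating the-pile itself rather than the alternatives above, here is a minimal sketch of streaming a Pile-style corpus with the Hugging Face `datasets` library. The dataset id `monology/pile-uncopyrighted` (a community mirror of the copyright-filtered Pile) is an assumption on our part, not something taken from this listing; the original official hosting has since been taken down.

```python
# Hypothetical sketch: stream a Pile-style corpus without downloading it in full.
# The dataset id below is an assumed community mirror, not the official release.
from datasets import load_dataset

pile = load_dataset("monology/pile-uncopyrighted", split="train", streaming=True)

# Each record carries the raw document in its "text" field (plus source metadata).
for i, example in enumerate(pile):
    print(example["text"][:200])
    if i >= 2:  # peek at the first few documents only
        break
```

Streaming mode iterates over shards on demand, which matters here: the full Pile is roughly 800 GB of text, so materializing it locally is rarely what you want for a quick inspection.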