EleutherAI / openwebtext2
☆90 · Updated 2 years ago
Alternatives and similar repositories for openwebtext2:
Users interested in openwebtext2 are comparing it to the libraries listed below.
- Python tools for processing the Stack Exchange data dumps into a text dataset for language models ☆81 · Updated last year
- ☆77 · Updated last year
- Pipeline for pulling and processing online language model pretraining data from the web ☆177 · Updated last year
- Open source library for few-shot NLP ☆78 · Updated last year
- XtremeDistil framework for distilling/compressing massive multilingual neural network models to tiny and efficient models for AI at scale ☆154 · Updated last year
- Techniques used to run BLOOM at inference in parallel ☆37 · Updated 2 years ago
- Evaluation suite for large-scale language models ☆125 · Updated 3 years ago
- The pipeline for the OSCAR corpus ☆168 · Updated last year
- Experiments with generating open-source language model assistants ☆97 · Updated last year
- Repo for training MLMs, CLMs, or T5-type models on the OLM pretraining data, but it should work with any Hugging Face text dataset ☆93 · Updated 2 years ago
- ☆111 · Updated 2 years ago
- Tools for managing datasets for governance and training ☆85 · Updated 3 months ago
- ☆97 · Updated 2 years ago
- Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188) ☆61 · Updated last year
- ☆97 · Updated 2 years ago
- An experimental implementation of the retrieval-enhanced language model ☆74 · Updated 2 years ago
- Code and data to support the paper "PAQ: 65 Million Probably-Asked Questions and What You Can Do With Them" ☆202 · Updated 3 years ago
- This repository contains the code for "Generating Datasets with Pretrained Language Models" ☆188 · Updated 3 years ago
- ☆148 · Updated 4 years ago
- ☆33 · Updated last year
- Large Scale Distributed Model Training strategy with Colossal AI and Lightning AI ☆57 · Updated last year
- Code for the paper "Mirostat: A Neural Text Decoding Algorithm that Directly Controls Perplexity" (https://arxiv.org/abs/2007.14966) ☆58 · Updated 3 years ago
- Tutorial to pretrain & fine-tune a 🤗 Flax T5 model on a TPUv3-8 with GCP ☆58 · Updated 2 years ago
- Open Instruction Generalist is an assistant trained on massive synthetic instructions to perform many millions of tasks ☆208 · Updated last year
- DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective ☆167 · Updated last month
- ☆130 · Updated 2 years ago
- ☆87 · Updated 2 years ago
- ☆67 · Updated 2 years ago
- ☆72 · Updated last year
- Implementation of Marge, Pre-training via Paraphrasing, in PyTorch ☆75 · Updated 4 years ago
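For orientation: OpenWebText2, like other Pile components listed above, is distributed as zstd-compressed JSONL shards, and the repository appears to ship its own reader utilities for this format. Below is a minimal sketch for iterating over one shard, assuming the Pile-style record layout with the document stored under a "text" key; the shard file name is a placeholder.

```python
import io
import json

import zstandard as zstd  # pip install zstandard


def read_jsonl_zst(path):
    """Yield one JSON record per line from a zstd-compressed JSONL shard."""
    with open(path, "rb") as fh:
        dctx = zstd.ZstdDecompressor()
        with dctx.stream_reader(fh) as reader:
            # Wrap the binary decompression stream so we can iterate lines as text.
            for line in io.TextIOWrapper(reader, encoding="utf-8"):
                yield json.loads(line)


# Placeholder shard name; real shards follow the project's own naming scheme.
for doc in read_jsonl_zst("openwebtext2_shard.jsonl.zst"):
    print(doc["text"][:80])  # Pile-style records typically keep the text under "text"
    break
```

Streaming decompression keeps memory flat regardless of shard size, which matters for pretraining-scale dumps; for the exact archive layout, defer to the reader utilities in the openwebtext2 repository itself.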