EleutherAI / gpt-neo
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
☆8,293 · Updated 3 years ago
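For context, the checkpoints trained with this codebase are most often consumed through the Hugging Face transformers port rather than the mesh-tensorflow training code itself. A minimal inference sketch, assuming the transformers library and the publicly released EleutherAI/gpt-neo-1.3B weights (this is not the repo's own API):

```python
# Minimal sketch: generating text from a GPT-Neo checkpoint via the
# Hugging Face transformers port of the EleutherAI weights. This is not
# part of the mesh-tensorflow training code in this repository; it only
# illustrates how the released checkpoints are commonly used for inference.
from transformers import GPTNeoForCausalLM, GPT2Tokenizer

model_name = "EleutherAI/gpt-neo-1.3B"  # 125M and 2.7B checkpoints are also published
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
model = GPTNeoForCausalLM.from_pretrained(model_name)

prompt = "EleutherAI's GPT-Neo is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Sample a short continuation from the model.
output_ids = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.9,
    max_length=60,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```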
Alternatives and similar repositories for gpt-neo
Users interested in gpt-neo are comparing it to the libraries listed below.
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,286 · Updated last month
- Model parallel transformers in JAX and Haiku ☆6,356 · Updated 2 years ago
- Repo for external large-scale work ☆6,540 · Updated last year
- Large-scale pretraining for dialogue ☆2,403 · Updated 2 years ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,406 · Updated 2 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆24,070 · Updated last year
- ☆1,593 · Updated 2 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,995 · Updated last year
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch ☆5,627 · Updated last year
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,611 · Updated last year
- A robust Python tool for text-based AI training and generation using GPT-2. ☆1,842 · Updated 2 years ago
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,149 · Updated 2 years ago
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆9,996 · Updated 3 weeks ago
- Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in Pytorch ☆11,312 · Updated last year
- CodeGen is a family of open-source models for program synthesis. Trained on TPU-v4. Competitive with OpenAI Codex. ☆5,121 · Updated 6 months ago
- GPT-3: Language Models are Few-Shot Learners ☆15,780 · Updated 4 years ago
- An implementation of training for GPT2, supports TPUs ☆1,424 · Updated 2 years ago
- Conditional Transformer Language Model for Controllable Generation ☆1,883 · Updated 3 months ago
- ☆4,560 · Updated last year
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,408 · Updated 3 months ago
- StableLM: Stability AI Language Models ☆15,821 · Updated last year
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆22,451 · Updated last year
- A collection of libraries to optimise AI model performance ☆8,372 · Updated last year
- 🦄 State-of-the-Art Conversational AI with Transfer Learning ☆1,750 · Updated 2 years ago
- Ongoing research training transformer models at scale ☆13,312 · Updated this week
- ☆2,863 · Updated 2 months ago
- The implementation of DeBERTa ☆2,130 · Updated last year
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆31,738 · Updated 2 months ago
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ☆15,423 · Updated this week
- ☆9,018 · Updated last year