openai / gpt-3
GPT-3: Language Models are Few-Shot Learners
☆15,782 · Updated 5 years ago
Alternatives and similar repositories for gpt-3
Users interested in gpt-3 are comparing it to the libraries listed below.
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆24,388 · Updated last year
- Repo for external large-scale work ☆6,546 · Updated last year
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,288 · Updated 3 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆2,001 · Updated last year
- ☆4,555 · Updated 2 years ago
- The goal of this project is to enable users to create cool web demos using the newly released OpenAI GPT-3 API with just a few lines of P… ☆2,887 · Updated 2 years ago
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,337 · Updated last month
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆31,952 · Updated last month
- Model parallel transformers in JAX and Haiku ☆6,353 · Updated 2 years ago
- DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Ras… ☆26,656 · Updated 5 months ago
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… ☆152,590 · Updated last week (see the usage sketch after this list)
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆10,232 · Updated last month (see the tokenization sketch after this list)
- JARVIS, a system to connect LLMs with the ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf ☆24,449 · Updated 3 months ago
- Ongoing research training transformer models at scale ☆14,225 · Updated this week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆40,733 · Updated this week
- ☆2,074 · Updated 3 years ago
- StableLM: Stability AI Language Models ☆15,792 · Updated last year
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,450 · Updated 2 weeks ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆22,934 · Updated last year
- Trax — Deep Learning with Clear Code and Speed ☆8,292 · Updated last month
- Google Research ☆36,744 · Updated last week
- Source code for Twitter's Recommendation Algorithm ☆10,402 · Updated last year
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,619 · Updated 2 years ago
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch ☆5,630 · Updated last year
- Unsupervised text tokenizer for Neural Network-based text generation. ☆11,441 · Updated 2 weeks ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,404 · Updated 2 years ago
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. ☆30,470 · Updated this week
- TensorFlow code and pre-trained models for BERT ☆39,672 · Updated last year
- Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in Pytorch ☆11,335 · Updated last year
- GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023) ☆7,678 · Updated 2 years ago
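
Many of the GPT-family repositories above are loadable through the 🤗 Transformers entry. A minimal sketch of that pipeline API, assuming the `transformers` package (plus PyTorch) is installed; the small `gpt2` checkpoint is used purely as an example model name, not a specific repo from this list:

```python
# Minimal text-generation sketch using the 🤗 Transformers pipeline API.
# Assumes `pip install transformers torch`; "gpt2" is an example checkpoint.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "Language models are few-shot learners because",
    max_new_tokens=30,       # cap on newly generated tokens
    num_return_sequences=1,  # one completion is enough for a smoke test
)
print(result[0]["generated_text"])
```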
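
The 💥 Tokenizers entry supplies the byte-level BPE step those models rely on. A small sketch, assuming the `tokenizers` package is installed and that the pretrained `gpt2` vocabulary can be fetched from the Hugging Face Hub:

```python
# Fast byte-level BPE tokenization sketch using the 🤗 Tokenizers library.
# Assumes `pip install tokenizers`; "gpt2" names the pretrained vocabulary
# downloaded from the Hugging Face Hub.
from tokenizers import Tokenizer

tok = Tokenizer.from_pretrained("gpt2")
enc = tok.encode("Language models are few-shot learners")
print(enc.tokens)  # subword pieces produced by the BPE merge rules
print(enc.ids)     # integer ids a model would consume
```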