openai / gpt-3
GPT-3: Language Models are Few-Shot Learners
☆15,735 · Updated 4 years ago
Alternatives and similar repositories for gpt-3:
Users interested in gpt-3 are comparing it to the repositories listed below.
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆23,033 · Updated 6 months ago
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,271 · Updated 2 years ago
- Model parallel transformers in JAX and Haiku ☆6,323 · Updated 2 years ago
- Repo for external large-scale work ☆6,517 · Updated 9 months ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆21,365 · Updated 6 months ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆36,800 · Updated this week
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,101 · Updated this week
- PyTorch package for the discrete VAE used for DALL·E. ☆10,829 · Updated last year
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ☆139,521 · Updated this week
- The goal of this project is to enable users to create cool web demos using the newly released OpenAI GPT-3 API with just a few lines of P… ☆2,888 · Updated last year
- Unsupervised text tokenizer for Neural Network-based text generation. ☆10,598 · Updated last week
- Running large language models on a single GPU for throughput-oriented scenarios. ☆9,261 · Updated 3 months ago
- Instruct-tune LLaMA on consumer hardware ☆18,805 · Updated 6 months ago
- StableLM: Stability AI Language Models ☆15,831 · Updated 10 months ago
- Source code for Twitter's Recommendation Algorithm ☆10,189 · Updated 7 months ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,270 · Updated 4 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆39,429 · Updated 2 months ago
- Development repository for the Triton language and compiler ☆14,452 · Updated this week
- An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena. ☆37,834 · Updated this week
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,502 · Updated last year
- Trax — Deep Learning with Clear Code and Speed ☆8,163 · Updated last week
- RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable)… ☆13,167 · Updated this week
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,959 · Updated last year
- Ongoing research training transformer models at scale ☆11,414 · Updated this week
- DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Ras… ☆25,888 · Updated 5 months ago
- Making large AI models cheaper, faster and more accessible ☆39,731 · Updated this week
- JARVIS, a system to connect LLMs with the ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf ☆23,942 · Updated 4 months ago
- Code and documentation to train Stanford's Alpaca models, and generate the data. ☆29,823 · Updated 7 months ago