openai / gpt-3
GPT-3: Language Models are Few-Shot Learners
☆ 15,750 · Updated 4 years ago
Alternatives and similar repositories for gpt-3:
Users interested in gpt-3 are comparing it to the libraries listed below.
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆ 23,226 · Updated 7 months ago
- Trax — Deep Learning with Clear Code and Speed ☆ 8,177 · Updated last month
- PyTorch package for the discrete VAE used for DALL·E. ☆ 10,831 · Updated last year
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆ 6,302 · Updated 3 weeks ago
- The goal of this project is to enable users to create cool web demos using the newly released OpenAI GPT-3 API with just a few lines of P… ☆ 2,892 · Updated last year
- This repository contains implementations and illustrative code to accompany DeepMind publications ☆ 13,651 · Updated this week
- ☆ 4,576 · Updated last year
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆ 3,399 · Updated 2 years ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆ 21,632 · Updated 7 months ago
- Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in Pytorch ☆ 11,237 · Updated 10 months ago
- Repo for external large-scale work ☆ 6,522 · Updated 10 months ago
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆ 8,287 · Updated 3 years ago
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch ☆ 5,605 · Updated last year
- Google Research ☆ 35,186 · Updated this week
- Unsupervised text tokenizer for Neural Network-based text generation. ☆ 10,725 · Updated 3 weeks ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆ 1,969 · Updated last year
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆ 9,509 · Updated last week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆ 37,612 · Updated this week
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆ 10,516 · Updated last year
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆ 7,142 · Updated this week
- A library for efficient similarity search and clustering of dense vectors. ☆ 33,821 · Updated this week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆ 15,979 · Updated last year
- 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. ☆ 141,820 · Updated this week
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆ 29,161 · Updated this week
- Model parallel transformers in JAX and Haiku ☆ 6,320 · Updated 2 years ago
- Library for fast text representation and classification. ☆ 26,120 · Updated last year
- Code and documentation to train Stanford's Alpaca models, and generate the data. ☆ 29,889 · Updated 8 months ago
- TensorFlow code and pre-trained models for BERT ☆ 38,882 · Updated 8 months ago
- Let us control diffusion models! ☆ 31,805 · Updated last year
- Approximate Nearest Neighbors in C++/Python optimized for memory usage and loading/saving to disk ☆ 13,621 · Updated 7 months ago