openai / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆22,792 · Updated 5 months ago
Alternatives and similar repositories for gpt-2:
Users interested in gpt-2 are comparing it to the libraries listed below.
- GPT-3: Language Models are Few-Shot Learners ☆15,714 · Updated 4 years ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆20,810 · Updated 5 months ago
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆30,809 · Updated last week
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,952 · Updated last year
- TensorFlow code and pre-trained models for BERT ☆38,519 · Updated 5 months ago
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆15,740 · Updated last year
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,399 · Updated 2 years ago
- A library for efficient similarity search and clustering of dense vectors. ☆32,387 · Updated this week (usage sketch after this list)
- Unsupervised text tokenizer for Neural Network-based text generation. ☆10,479 · Updated last month (usage sketch after this list)
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ☆30,985 · Updated this week (usage sketch after this list)
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ☆137,641 · Updated this week (usage sketch after this list)
- Code and documentation to train Stanford's Alpaca models, and generate the data. ☆29,738 · Updated 6 months ago
- The fastai deep learning library ☆26,505 · Updated last month
- Trax — Deep Learning with Clear Code and Speed ☆8,136 · Updated this week
- Open Source Neural Machine Translation and (Large) Language Models in PyTorch ☆6,803 · Updated last week
- XLNet: Generalized Autoregressive Pretraining for Language Understanding ☆6,180 · Updated last year
- Ongoing research training transformer models at scale ☆11,109 · Updated this week
- Open standard for machine learning interoperability
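
Quick usage sketches for a few of the libraries above. First, the similarity-search library (faiss): a minimal sketch that builds an exact L2 index over random vectors and queries it. The dimensionality, array sizes, and the `faiss-cpu` install hint are illustrative assumptions, not from the listing.

```python
import numpy as np
import faiss  # pip install faiss-cpu

d = 64  # vector dimensionality (arbitrary choice for this sketch)
xb = np.random.random((1000, d)).astype("float32")  # database vectors
xq = np.random.random((5, d)).astype("float32")     # query vectors

index = faiss.IndexFlatL2(d)  # exact (brute-force) L2 index
index.add(xb)                 # index the database vectors
D, I = index.search(xq, 4)    # 4 nearest neighbours per query
print(I)                      # row i holds the ids of xq[i]'s neighbours
```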
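Next, the text tokenizer (sentencepiece): a sketch that trains a tiny BPE model and tokenizes one sentence. `corpus.txt` is a hypothetical local text file, and the vocabulary size and model type are illustrative choices.

```python
import sentencepiece as spm

# Train a toy BPE model; "corpus.txt" is a hypothetical local text file.
spm.SentencePieceTrainer.train(
    input="corpus.txt", model_prefix="toy", vocab_size=1000, model_type="bpe"
)

# Load the trained model and tokenize a sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="toy.model")
print(sp.encode("Language models are unsupervised multitask learners.", out_type=str))
```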
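For JAX, a sketch of the composable transformations its description mentions: `jax.grad` differentiates a scalar loss and `jax.jit` compiles the result for XLA. The toy linear model is an assumption for illustration.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Mean squared error of a linear model; purely illustrative.
    return jnp.mean((x @ w - y) ** 2)

grad_loss = jax.jit(jax.grad(loss))  # differentiate, then JIT-compile

w = jnp.zeros(3)
x = jnp.arange(6.0).reshape(2, 3)
y = jnp.array([1.0, 2.0])
print(grad_loss(w, x, y))  # gradient of the loss with respect to w
```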
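Finally, a 🤗 Transformers sketch that ties back to this page's subject: sampling from the original GPT-2 checkpoint via the `pipeline` API. This assumes `transformers` plus a backend such as PyTorch are installed; `"gpt2"` is the 124M-parameter checkpoint hosted on the Hugging Face Hub.

```python
from transformers import pipeline, set_seed

set_seed(42)  # make the sample reproducible
generator = pipeline("text-generation", model="gpt2")  # 124M GPT-2 checkpoint
out = generator("Language models are", max_new_tokens=25, num_return_sequences=1)
print(out[0]["generated_text"])
```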