openai / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆24,472 · Updated last year
Alternatives and similar repositories for gpt-2
Users interested in gpt-2 are comparing it to the libraries listed below.
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆2,008 · Updated 2 years ago
- Unsupervised text tokenizer for Neural Network-based text generation (see the tokenizer-training sketch after this list). ☆11,508 · Updated this week
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆32,020 · Updated 2 months ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆23,160 · Updated last year
- GPT-3: Language Models are Few-Shot Learners ☆15,775 · Updated 5 years ago
- Ongoing research training transformer models at scale ☆14,602 · Updated this week
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,460 · Updated last month
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ☆14,630 · Updated last year
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ☆16,271 · Updated this week
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆10,301 · Updated 2 weeks ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,405 · Updated 3 years ago
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… (see the model-loading sketch after this list) ☆153,866 · Updated this week
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,622 · Updated 2 years ago
- TensorFlow code and pre-trained models for BERT ☆39,730 · Updated last year
- Repo for external large-scale work ☆6,547 · Updated last year
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,286 · Updated 3 years ago
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆41,015 · Updated this week
- Model parallel transformers in JAX and Haiku ☆6,357 · Updated 2 years ago
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆16,818 · Updated 2 years ago
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,350 · Updated last week
- Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ☆2,263 · Updated 6 years ago
- An open-source NLP research library, built on PyTorch. ☆11,886 · Updated 3 years ago
- Development repository for the Triton language and compiler ☆17,861 · Updated this week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ☆7,833 · Updated 6 months ago
- Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet. ☆14,641 · Updated 2 weeks ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,875 · Updated 5 months ago
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ☆34,299 · Updated this week
- End-to-End Speech Processing Toolkit ☆9,641 · Updated this week
- Code and documentation to train Stanford's Alpaca models, and generate the data. ☆30,252 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ☆1,145 · Updated 3 years ago