openai / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
⭐ 23,033 · Updated 6 months ago
Alternatives and similar repositories for gpt-2:
Users interested in gpt-2 are comparing it to the libraries listed below.
- GPT-3: Language Models are Few-Shot Learners ⭐ 15,735 · Updated 4 years ago
- Dataset of GPT-2 outputs for research in detection, biases, and more ⭐ 1,959 · Updated last year
- 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. ⭐ 139,521 · Updated this week
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ⭐ 8,271 · Updated 2 years ago
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ⭐ 21,365 · Updated 6 months ago
- Unsupervised text tokenizer for Neural Network-based text generation. ⭐ 10,598 · Updated last week
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ⭐ 6,270 · Updated 4 months ago
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ⭐ 3,399 · Updated 2 years ago
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ⭐ 7,101 · Updated this week
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ⭐ 39,429 · Updated 2 months ago
- Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the mo… ⭐ 22,784 · Updated 6 months ago
- XLNet: Generalized Autoregressive Pretraining for Language Understanding ⭐ 6,182 · Updated last year
- Code for the paper "Jukebox: A Generative Model for Music" ⭐ 7,924 · Updated 8 months ago
- TensorFlow code and pre-trained models for BERT ⭐ 38,669 · Updated 6 months ago
- Trax – Deep Learning with Clear Code and Speed ⭐ 8,163 · Updated last week
- Repo for external large-scale work ⭐ 6,517 · Updated 9 months ago
- Ongoing research training transformer models at scale ⭐ 11,414 · Updated this week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ⭐ 15,838 · Updated last year
- An open-source NLP research library, built on PyTorch. ⭐ 11,806 · Updated 2 years ago
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more ⭐ 31,324 · Updated this week
- Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ⭐ 2,189 · Updated 6 years ago
- Open Source Neural Machine Translation and (Large) Language Models in PyTorch ⭐ 6,823 · Updated last month
- 🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP ⭐ 12,568 · Updated last year
- LLM inference in C/C++ ⭐ 74,674 · Updated this week
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ⭐ 13,150 · Updated this week
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations ⭐ 3,265 · Updated last year
- Code for the paper "Language Models are Unsupervised Multitask Learners" ⭐ 1,148 · Updated 2 years ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ⭐ 17,363 · Updated this week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ⭐ 7,130 · Updated last year
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ⭐ 10,502 · Updated last year