openai / gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆23,054 · Updated 6 months ago
Alternatives and similar repositories for gpt-2:
Users interested in gpt-2 are comparing it to the libraries listed below.
- minimaxir/gpt-2-simple: Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,399 · Updated 2 years ago
- openai/gpt-2-output-dataset: Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,959 · Updated last year
- karpathy/minGPT: A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆21,388 · Updated 6 months ago
- openai/gpt-3: GPT-3: Language Models are Few-Shot Learners ☆15,737 · Updated 4 years ago
- NVIDIA/Megatron-LM: Ongoing research training transformer models at scale ☆11,448 · Updated this week
- microsoft/DeepSpeed: DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆36,866 · Updated this week
- huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. ☆139,759 · Updated this week
- google/sentencepiece: Unsupervised text tokenizer for Neural Network-based text generation. ☆10,605 · Updated last week
- facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆31,018 · Updated last month
- google-research/text-to-text-transfer-transformer: Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,270 · Updated 5 months ago
- openai/finetune-transformer-lm: Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ☆2,189 · Updated 6 years ago
- facebookresearch/fastText: Library for fast text representation and classification. ☆26,072 · Updated 10 months ago
- tensorflow/tensor2tensor: Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆15,847 · Updated last year
- tatsu-lab/stanford_alpaca: Code and documentation to train Stanford's Alpaca models, and generate the data. ☆29,827 · Updated 7 months ago
- ludwig-ai/ludwig: Low-code framework for building custom LLMs, neural networks, and other AI models ☆11,328 · Updated this week
- google-research/bert: TensorFlow code and pre-trained models for BERT ☆38,669 · Updated 6 months ago
- google/trax: Trax — Deep Learning with Clear Code and Speed ☆8,163 · Updated last week
- EleutherAI/gpt-neox: An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,101 · Updated this week
- huggingface/tokenizers: 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆9,387 · Updated this week
- horovod/horovod: Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet. ☆14,389 · Updated 2 weeks ago
- mozilla/DeepSpeech: DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Ras… ☆25,888 · Updated 5 months ago
- facebookresearch/ParlAI: A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,502 · Updated last year
- zihangdai/xlnet: XLNet: Generalized Autoregressive Pretraining for Language Understanding ☆6,182 · Updated last year
- Dao-AILab/flash-attention: Fast and memory-efficient exact attention ☆15,541 · Updated this week
- openai/CLIP: CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image ☆27,419 · Updated 6 months ago
- allenai/allennlp: An open-source NLP research library, built on PyTorch. ☆11,806 · Updated 2 years ago
- OpenNMT/OpenNMT-py: Open Source Neural Machine Translation and (Large) Language Models in PyTorch ☆6,823 · Updated last month
- UKPLab/sentence-transformers: State-of-the-Art Text Embeddings ☆16,018 · Updated last week
- NVIDIA/DeepLearningExamples: State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ☆13,951 · Updated 6 months ago
- onnx/onnx: Open standard for machine learning interoperability ☆18,467 · Updated this week
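Several entries above (sentencepiece, tokenizers, and gpt-2 itself) center on subword tokenization. A minimal, illustrative sketch of the byte-pair-encoding merge step these tools build on — not any library's actual implementation, and using a made-up toy corpus:

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across a corpus.
    `words` maps a tuple of symbols to its frequency."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return max(pairs, key=pairs.get)

def merge_pair(words, pair):
    """Replace every adjacent occurrence of `pair` with one merged symbol."""
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
                out.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = freq
    return merged

# Toy corpus: word (pre-split into characters) -> count.
corpus = {("l", "o"): 1, ("l", "o", "w"): 5, ("l", "o", "w", "e", "r"): 2}
pair = most_frequent_pair(corpus)   # ("l", "o") appears 8 times, the most
corpus = merge_pair(corpus, pair)
print(pair, corpus)
```

A real BPE trainer repeats this merge loop for a fixed vocabulary budget and records the merge order for use at encoding time; the libraries above add byte-level handling, speed, and many other refinements.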