openai/gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆23,400 · Updated 8 months ago
Alternatives and similar repositories for gpt-2:
Users interested in gpt-2 are comparing it to the repositories listed below.
- openai/gpt-2-output-dataset: Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,978 · Updated last year
- openai/gpt-3: Language Models are Few-Shot Learners ☆15,751 · Updated 4 years ago
- facebookresearch/ParlAI: A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,527 · Updated last year
- google/sentencepiece: Unsupervised text tokenizer for Neural Network-based text generation. ☆10,836 · Updated last month
- facebookresearch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆31,390 · Updated 3 months ago
- karpathy/minGPT: A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆21,842 · Updated 8 months ago
- pytorch/pytorch: Tensors and Dynamic neural networks in Python with strong GPU acceleration ☆89,697 · Updated this week
- google-research/bert: TensorFlow code and pre-trained models for BERT ☆39,098 · Updated 9 months ago
- google/trax: Trax — Deep Learning with Clear Code and Speed ☆8,203 · Updated 3 weeks ago
- karpathy/nanoGPT: The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆41,081 · Updated 4 months ago
- huggingface/tokenizers: 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ☆9,659 · Updated 3 weeks ago
- allenai/allennlp: An open-source NLP research library, built on PyTorch. ☆11,843 · Updated 2 years ago
- facebookresearch/faiss: A library for efficient similarity search and clustering of dense vectors. ☆34,749 · Updated this week
- EleutherAI/gpt-neo: An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,286 · Updated 3 years ago
- onnx/onnx: Open standard for machine learning interoperability ☆18,895 · Updated this week
- microsoft/nni: An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning ☆14,179 · Updated 10 months ago
- minimaxir/gpt-2-simple: Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,404 · Updated 2 years ago
- google-research/text-to-text-transfer-transformer: Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,343 · Updated last week
- tensorflow/tensor2tensor: Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆16,102 · Updated last year
- Lightning-AI/pytorch-lightning: Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆29,414 · Updated this week
- huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX (see the sketch after this list) ☆143,804 · Updated this week
- horovod/horovod: Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet. ☆14,464 · Updated 2 weeks ago
- openai/gym: A toolkit for developing and comparing reinforcement learning algorithms. ☆35,928 · Updated 6 months ago
- NVIDIA/Megatron-LM: Ongoing research training transformer models at scale ☆12,261 · Updated this week
- keras-team/autokeras: AutoML library for deep learning ☆9,228 · Updated 4 months ago
- microsoft/DeepSpeed: DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆38,206 · Updated this week
- explosion/spaCy: 💫 Industrial-strength Natural Language Processing (NLP) in Python ☆31,509 · Updated 3 weeks ago
- openai/finetune-transformer-lm: Code and model for the paper "Improving Language Understanding by Generative Pre-Training" ☆2,208 · Updated 6 years ago
- EleutherAI/gpt-neox: An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,172 · Updated this week
- openai/CLIP: CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image ☆28,780 · Updated 9 months ago
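
Several of the repositories above expose the released GPT-2 weights directly. For orientation, here is a minimal sketch of loading GPT-2 and sampling from it with huggingface/transformers; the "gpt2" checkpoint name is the 124M-parameter release, and the prompt and generation settings are illustrative choices, not a prescribed configuration.

```python
# Minimal sketch: sampling from the released GPT-2 weights via
# huggingface/transformers (pip install transformers torch).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # 124M-parameter checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Language models are unsupervised multitask learners because"
inputs = tokenizer(prompt, return_tensors="pt")

# Top-k sampling; k=40 echoes the sampling setup described alongside the
# original gpt-2 release, but the values here are illustrative.
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same few lines work for the larger checkpoints ("gpt2-medium", "gpt2-large", "gpt2-xl") by swapping the model name, which is the main reason transformers appears in nearly every comparison list for this repo.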