openai/gpt-2
Code for the paper "Language Models are Unsupervised Multitask Learners"
☆23,421 · Updated 9 months ago
Alternatives and similar repositories for gpt-2
Users interested in gpt-2 are comparing it to the libraries listed below.
- GPT-3: Language Models are Few-Shot Learners ☆15,753 · Updated 4 years ago
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. ☆31,410 · Updated 4 months ago
- A library for efficient similarity search and clustering of dense vectors. ☆34,852 · Updated this week
- Unsupervised text tokenizer for Neural Network-based text generation. ☆10,857 · Updated last month
- Dataset of GPT-2 outputs for research in detection, biases, and more ☆1,978 · Updated last year
- Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts ☆3,404 · Updated 2 years ago
- An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library. ☆8,287 · Updated 3 years ago
- Tensors and Dynamic neural networks in Python with strong GPU acceleration ☆89,831 · Updated this week
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ☆10,528 · Updated last year
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. ☆29,439 · Updated last week
- 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. ☆144,305 · Updated this week
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ☆6,349 · Updated 2 weeks ago
- NLTK Source ☆14,035 · Updated last week
- A toolkit for developing and comparing reinforcement learning algorithms. ☆35,944 · Updated 7 months ago
- TensorFlow code and pre-trained models for BERT ☆39,118 · Updated 9 months ago
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ☆14,317 · Updated this week
- 💫 Industrial-strength Natural Language Processing (NLP) in Python ☆31,526 · Updated last month
- An open-source NLP research library, built on PyTorch. ☆11,846 · Updated 2 years ago
- An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries ☆7,178 · Updated this week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ☆38,278 · Updated this week
- Trax — Deep Learning with Clear Code and Speed ☆8,204 · Updated last month
- Ongoing research training transformer models at scale ☆12,310 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆21,866 · Updated 9 months ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ☆21,217 · Updated 2 months ago
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… ☆14,239 · Updated 9 months ago
- 🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools ☆20,107 · Updated this week
- Open standard for machine learning interoperability ☆18,915 · Updated this week
- Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch ☆5,614 · Updated last year
- XLNet: Generalized Autoregressive Pretraining for Language Understanding ☆6,188 · Updated last year
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ☆16,118 · Updated last year