SIC98 / GPT2-python-code-generator
GPT2 fine-tuning with Transformers 🤗
☆27 · Updated 4 years ago
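The repository's own description is only "GPT2 fine-tuning with Transformers 🤗", so for orientation here is a minimal sketch of what such fine-tuning typically looks like with the Hugging Face Transformers Trainer API. The corpus path `train.txt`, block size, and hyperparameters are illustrative assumptions and are not taken from this repository.

```python
# Minimal GPT-2 fine-tuning sketch with Hugging Face Transformers.
# File path, block size, and hyperparameters are placeholders.
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,                      # deprecated in newer versions; the datasets library is preferred
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# TextDataset chunks a plain-text file into fixed-length blocks of token ids.
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="train.txt",   # hypothetical corpus, e.g. concatenated Python source files
    block_size=128,
)

# mlm=False -> causal language modeling (next-token prediction), as GPT-2 expects.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="gpt2-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_steps=500,
    logging_steps=100,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
)

trainer.train()
trainer.save_model("gpt2-finetuned")
```

The saved checkpoint can then be reloaded with `GPT2LMHeadModel.from_pretrained("gpt2-finetuned")` and sampled from via `model.generate()` to produce completions in the style of the training corpus.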
Alternatives and similar repositories for GPT2-python-code-generator:
Users interested in GPT2-python-code-generator are comparing it to the libraries listed below.
- A basic and simple tool for code auto completion ☆59 · Updated 7 months ago
- Code Generator ☆23 · Updated 2 years ago
- Large Scale Distributed Model Training strategy with Colossal AI and Lightning AI ☆57 · Updated last year
- A Streamlit app running the GPT-2 language model for text classification, built with PyTorch, Transformers and AWS SageMaker. ☆39 · Updated 3 years ago
- Magnum-NLC2CMD is the winning solution for the NeurIPS 2020 NLC2CMD challenge. ☆33 · Updated 2 years ago
- Observe the slow deterioration of my mental sanity in the github commit history ☆12 · Updated last year
- ☆35 · Updated last year
- Source code for the GPT-2 story generation models in the EMNLP 2020 paper "STORIUM: A Dataset and Evaluation Platform for Human-in-the-Lo…" ☆39 · Updated last year
- ☆68 · Updated 2 years ago
- Codebase for the Medium Article on Fine-tuning GPT2 for Text Generation ☆69 · Updated 4 years ago
- Code for the NLP4Prog workshop paper "Reading StackOverflow Encourages Cheating: Adding Question Text Improves Extractive Code Generation" ☆21 · Updated 3 years ago
- A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+ ☆37 · Updated 4 years ago
- A repo for code based language models ☆18 · Updated 4 years ago
- Transformers at any scale ☆41 · Updated last year
- The jiant toolkit for general-purpose text understanding models ☆22 · Updated 4 years ago
- Implementation of autoregressive language model using improved Transformer and DeepSpeed pipeline parallelism. ☆32 · Updated 3 years ago
- Official code release for the paper Coder Reviewer Reranking for Code Generation. ☆42 · Updated 2 years ago
- Fine-tuning GPT-2 Small for Question Answering ☆129 · Updated 2 years ago
- All my experiments with the various transformers and various transformer frameworks available ☆14 · Updated 3 years ago
- ☆32 · Updated 2 years ago
- Script for downloading GitHub. ☆91 · Updated 8 months ago
- ☆20 · Updated 3 years ago
- The code for the Subformer, from the EMNLP 2021 Findings paper: "Subformer: Exploring Weight Sharing for Parameter Efficiency in Generati…" ☆14 · Updated 3 years ago
- ☆14 · Updated 3 years ago
- ☆24 · Updated 2 years ago
- This repository is the official implementation of our paper MVP: Multi-task Supervised Pre-training for Natural Language Generation. ☆70 · Updated 2 years ago
- An extension of the Transformers library to include a T5ForSequenceClassification class. ☆38 · Updated last year
- PERFECT: Prompt-free and Efficient Few-shot Learning with Language Models ☆109 · Updated 2 years ago
- Simple and easy stable diffusion inference with LightningModule on GPU, CPU and MPS (possibly all devices supported by Lightning). ☆17 · Updated last year