SIC98 / GPT2-python-code-generator
GPT2 finetuning with transformers 🤗
☆28 · Updated 4 years ago
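The project at the top of this page fine-tunes GPT-2 on Python source code with the Hugging Face Transformers library. The sketch below is a minimal illustration of that kind of causal-LM fine-tuning, not the repository's actual training script; the `train.txt` corpus file, the hyperparameters, and the `gpt2-python` output directory are placeholder assumptions.

```python
# Minimal GPT-2 fine-tuning sketch with Hugging Face Transformers.
# train.txt, the hyperparameters, and the "gpt2-python" output directory are
# illustrative placeholders, not values taken from the repository.
from datasets import load_dataset
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical corpus: one Python snippet per line in train.txt.
dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False gives the causal language-modeling objective GPT-2 is trained with.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-python",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    logging_steps=100,
    save_steps=1000,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()

trainer.save_model("gpt2-python")
tokenizer.save_pretrained("gpt2-python")
```

Saving the tokenizer alongside the model keeps the checkpoint self-contained, so it can be reloaded later for generation.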
Alternatives and similar repositories for GPT2-python-code-generator
Users interested in GPT2-python-code-generator are comparing it to the libraries listed below
- A basic and simple tool for code auto completion (a generation sketch follows this list) ☆60 · Updated last year
- Large Scale Distributed Model Training strategy with Colossal AI and Lightning AI ☆56 · Updated 2 years ago
- Leetcode using AI ☆107 · Updated 4 years ago
- Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences" ☆70 · Updated 2 years ago
- Script for downloading GitHub. ☆97 · Updated last year
- Code Generator ☆23 · Updated 2 years ago
- Implementation of TableFormer, Robust Transformer Modeling for Table-Text Encoding, in Pytorch ☆39 · Updated 3 years ago
- Inference script for Meta's LLaMA models using Hugging Face wrapper ☆110 · Updated 2 years ago
- Magnum-NLC2CMD is the winning solution for the NeurIPS 2020 NLC2CMD challenge. ☆33 · Updated 2 years ago
- Code for the NLP4Prog workshop paper "Reading StackOverflow Encourages Cheating: Adding Question Text Improves Extractive Code Generation" ☆21 · Updated 4 years ago
- Implementation of Token Shift GPT - An autoregressive model that solely relies on shifting the sequence space for mixing ☆50 · Updated 3 years ago
- Official code release for the paper Coder Reviewer Reranking for Code Generation. ☆45 · Updated 2 years ago
- Fine-tuning GPT-2 Small for Question Answering ☆130 · Updated 2 years ago
- PyTorch – SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models. ☆62 · Updated 3 years ago
- Developing tools to automatically analyze datasets ☆75 · Updated last year
- A minimal TF2 re-implementation of the OpenAI GPT training ☆57 · Updated 4 years ago
- ☆80 · Updated 8 months ago
- PERFECT: Prompt-free and Efficient Few-shot Learning with Language Models ☆110 · Updated 3 years ago
- An instruction-based benchmark for text improvements. ☆143 · Updated 3 years ago
- ☆24 · Updated 3 years ago
- Source code for the GPT-2 story generation models in the EMNLP 2020 paper "STORIUM: A Dataset and Evaluation Platform for Human-in-the-Lo…" ☆38 · Updated last year
- A Streamlit app running GPT-2 language model for text classification, built with Pytorch, Transformers and AWS SageMaker. ☆39 · Updated 3 years ago
- Google's Meena transformer chatbot implementation ☆105 · Updated 4 years ago
- Exploring finetuning public checkpoints on filtered 8K sequences from the Pile ☆116 · Updated 2 years ago
- Pretrained Language Models for Source code ☆253 · Updated 4 years ago
- ☆69 · Updated 2 years ago
- ☆98 · Updated 2 years ago
- A *tuned* minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training ☆119 · Updated 4 years ago
- Code for the paper "BERT Loses Patience: Fast and Robust Inference with Early Exit". ☆66 · Updated 4 years ago
- RWKV-v2-RNN trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details. ☆67 · Updated 3 years ago
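For the code auto-completion entry above, a companion sketch: prompting a fine-tuned GPT-2 checkpoint for Python completion. It assumes the hypothetical `gpt2-python` directory produced by the training sketch near the top of this page; the prompt and decoding settings are illustrative.

```python
# Minimal code-completion sketch with a fine-tuned GPT-2 checkpoint.
# "gpt2-python" is the hypothetical output directory from the training sketch
# above; the prompt and decoding settings are illustrative.
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2-python")
model = GPT2LMHeadModel.from_pretrained("gpt2-python")
model.eval()

prompt = "def fibonacci(n):\n"
inputs = tokenizer(prompt, return_tensors="pt")

# Nucleus sampling keeps completions varied but still likely under the model.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```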