SIC98 / GPT2-python-code-generator
GPT-2 fine-tuning with 🤗 Transformers
☆28 · Updated 4 years ago
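For orientation, a GPT-2 fine-tune with 🤗 Transformers typically follows the pattern sketched below. This is a minimal sketch, not the repository's actual training script: the corpus file name, block size, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch: fine-tuning GPT-2 on a plain-text file of Python snippets
# with Hugging Face Transformers. File path and hyperparameters are
# illustrative placeholders, not values taken from this repository.
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token; reuse EOS
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Plain-text training corpus of Python code, chunked into fixed-length blocks.
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="python_snippets.txt",  # hypothetical corpus file
    block_size=128,
)

# GPT-2 is a causal LM, so masked-LM collation is disabled.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="gpt2-python-code-generator",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
)
trainer.train()
```

After training, `model.generate()` (or a `text-generation` pipeline pointed at the output directory) can sample Python code from a prompt.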
Alternatives and similar repositories for GPT2-python-code-generator
Users interested in GPT2-python-code-generator are comparing it to the repositories listed below.
- A basic and simple tool for code auto completion ☆59 · Updated last year
- Large Scale Distributed Model Training strategy with Colossal AI and Lightning AI ☆56 · Updated 2 years ago
- Leetcode using AI ☆109 · Updated 4 years ago
- Code Generator ☆23 · Updated 2 years ago
- Official code release for the paper "Coder Reviewer Reranking for Code Generation" ☆45 · Updated 2 years ago
- Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences" ☆70 · Updated 2 years ago
- Magnum-NLC2CMD is the winning solution for the NeurIPS 2020 NLC2CMD challenge ☆33 · Updated 2 years ago
- Implementation of TableFormer, Robust Transformer Modeling for Table-Text Encoding, in PyTorch ☆39 · Updated 3 years ago
- Code for the NLP4Prog workshop paper "Reading StackOverflow Encourages Cheating: Adding Question Text Improves Extractive Code Generation" ☆21 · Updated 4 years ago
- An implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism ☆30 · Updated last week
- Fine-tuning GPT-2 Small for Question Answering ☆130 · Updated 3 years ago
- ☆67 · Updated last year
- Models and datasets for annotated code search ☆35 · Updated 2 years ago
- Implementation of Token Shift GPT, an autoregressive model that relies solely on shifting the sequence space for mixing ☆49 · Updated 3 years ago
- ☆80 · Updated 9 months ago
- PyTorch – SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models ☆62 · Updated 3 years ago
- Developing tools to automatically analyze datasets ☆75 · Updated last year
- Implementation of N-Grammer, augmenting Transformers with latent n-grams, in PyTorch ☆76 · Updated 3 years ago
- ☆24 · Updated 3 years ago
- An extension of the Transformers library that adds a T5ForSequenceClassification class ☆40 · Updated 2 years ago
- Inference script for Meta's LLaMA models using a Hugging Face wrapper ☆110 · Updated 2 years ago
- A *tuned* minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training ☆119 · Updated 4 years ago
- Minimal code to train a Large Language Model (LLM) ☆171 · Updated 3 years ago
- PERFECT: Prompt-free and Efficient Few-shot Learning with Language Models ☆111 · Updated last month
- Evaluation suite for large-scale language models ☆129 · Updated 4 years ago
- A Streamlit app running a GPT-2 language model for text classification, built with PyTorch, Transformers and AWS SageMaker ☆39 · Updated 3 years ago
- ☆69 · Updated 2 years ago
- Pretrained Language Models for Source code ☆252 · Updated 4 years ago
- ☆44 · Updated last year
- Source code for the GPT-2 story generation models in the EMNLP 2020 paper "STORIUM: A Dataset and Evaluation Platform for Human-in-the-Lo…" ☆38 · Updated 2 years ago