SIC98 / GPT2-python-code-generator
GPT2 finetuning with transformers 🤗
⭐27 · Updated 3 years ago
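For context, fine-tuning GPT-2 on Python code with the 🤗 transformers library usually follows the standard causal-language-modeling recipe sketched below. This is a minimal sketch, not code from this repository: the base checkpoint (`gpt2`), the corpus file `python_code.txt`, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of GPT-2 fine-tuning with Hugging Face transformers.
# Assumptions (not taken from the repo): base checkpoint "gpt2",
# a plain-text corpus of Python snippets at "python_code.txt".
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# TextDataset chunks the raw file into fixed-length blocks for causal LM training.
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="python_code.txt",  # hypothetical training corpus
    block_size=128,
)

# mlm=False -> plain next-token (causal) language modeling, as GPT-2 expects.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="gpt2-python-code-generator",
    num_train_epochs=3,             # illustrative hyperparameters
    per_device_train_batch_size=4,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
)
trainer.train()
```

After training, the saved checkpoint can be loaded with `GPT2LMHeadModel.from_pretrained("gpt2-python-code-generator")` and used with `model.generate(...)` to produce code completions.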
Related projects
Alternatives and complementary repositories for GPT2-python-code-generator
- A basic and simple tool for code auto completion ⭐58 · Updated 3 months ago
- Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences" ⭐71 · Updated last year
- Observe the slow deterioration of my mental sanity in the github commit history ⭐13 · Updated last year
- ⭐23 · Updated 2 years ago
- Code Generator ⭐23 · Updated last year
- Transformers at any scale ⭐41 · Updated 10 months ago
- A Streamlit app running GPT-2 language model for text classification, built with Pytorch, Transformers and AWS SageMaker. ⭐39 · Updated 2 years ago
- Implementation of autoregressive language model using improved Transformer and DeepSpeed pipeline parallelism. ⭐32 · Updated 2 years ago
- A repo for code based language models ⭐18 · Updated 3 years ago
- A minimal TF2 re-implementation of the OpenAI GPT training ⭐56 · Updated 3 years ago
- The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021) ⭐52 · Updated 2 years ago
- A package for fine-tuning Transformers with TPUs, written in Tensorflow2.0+ ⭐37 · Updated 3 years ago
- This repository contains the code for the paper "Prompting ELECTRA: Few-Shot Learning with Discriminative Pre-Trained Models" ⭐45 · Updated 2 years ago
- A reimplementation of KOSMOS-1 from "Language Is Not All You Need: Aligning Perception with Language Models" ⭐27 · Updated last year
- ⭐32 · Updated last year
- NLP Examples using the 🤗 libraries ⭐42 · Updated 3 years ago
- Script for downloading GitHub. ⭐88 · Updated 4 months ago
- ⭐19 · Updated 3 years ago
- Comparing M2M and mT5 on rare language pairs, blog post: https://medium.com/@abdessalemboukil/comparing-facebooks-m2m-to-mt5-in-low-re… ⭐13 · Updated 3 years ago
- Large Scale Distributed Model Training strategy with Colossal AI and Lightning AI ⭐58 · Updated last year
- Implementation of TableFormer, Robust Transformer Modeling for Table-Text Encoding, in Pytorch ⭐36 · Updated 2 years ago
- Helper scripts and notes that were used while porting various nlp models ⭐44 · Updated 2 years ago
- ⭐32 · Updated 2 years ago
- Source code for the GPT-2 story generation models in the EMNLP 2020 paper "STORIUM: A Dataset and Evaluation Platform for Human-in-the-Lo… ⭐39 · Updated 10 months ago
- Codebase for the Medium Article on Fine-tuning GPT2 for Text Generation ⭐69 · Updated 4 years ago
- Implementation of Token Shift GPT - An autoregressive model that solely relies on shifting the sequence space for mixing ⭐47 · Updated 2 years ago
- Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch ⭐72 · Updated last year
- Implementation of COCO-LM, Correcting and Contrasting Text Sequences for Language Model Pretraining, in Pytorch ⭐45 · Updated 3 years ago