lopuhin / transformer-lm
Transformer language model (GPT-2) with sentencepiece tokenizer
☆164 · Updated 4 years ago
Alternatives and similar repositories for transformer-lm:
Users interested in transformer-lm are comparing it to the repositories listed below.
- ☆72 · Updated 6 years ago
- ConvAI2 submission. ☆294 · Updated 2 years ago
- Unsupervised Question Answering via Cloze Translation ☆219 · Updated 2 years ago
- Unsupervised Statistical Machine Translation ☆229 · Updated 4 years ago
- Method to encode text for GPT-2 to generate text based on provided keywords ☆260 · Updated 4 years ago
- ☆322 · Updated 2 years ago
- Load GPT-2 checkpoint and generate text ☆127 · Updated 3 years ago
- A Corpus for Multilingual Document Classification in Eight Languages ☆151 · Updated 2 years ago
- Code to reproduce results from the paper "MultiFiT: Efficient Multi-lingual Language Model Fine-tuning" (https://arxiv.org/abs/1909.04761) ☆283 · Updated 4 years ago
- Datasets I have created for scientific summarization, and a trained BertSum model ☆115 · Updated 5 years ago
- xfspell — the Transformer Spell Checker ☆190 · Updated 4 years ago
- XLNet for generating language ☆165 · Updated 4 years ago
- Code for the AllenNLP demo ☆197 · Updated 2 years ago
- Interactive explorer for language models ☆133 · Updated 3 years ago
- ☆604 · Updated 6 months ago
- EMNLP 2020: "Dialogue Response Ranking Training with Large-Scale Human Feedback Data" ☆340 · Updated 5 months ago
- PyTorch library for end-to-end training, inference, and serving of transformer models ☆70 · Updated 2 weeks ago
- XLNet: fine-tuning on an RTX 2080 GPU (8 GB) ☆154 · Updated 5 years ago
- Builds a WordPiece (subword) vocabulary compatible with Google Research's BERT ☆229 · Updated 4 years ago
- A tutorial on pretraining BERT on your own dataset using a Google TPU ☆44 · Updated 5 years ago
- Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing